Table of Contents
BIG DATA AND DEEP LEARNING INTRODUCTION
1.1 MATLAB AND BIG DATA
1.1.1 Access Data
1.1.2 Explore, Process, and Analyze Data
1.1.3 Develop Predictive Models
1.2 DEEP LEARNING
1.2.1 Definitions
1.2.2 Concepts
1.2.3 Deep learning and neural networks
1.2.4 Deep neural networks
1.2.5 Convolutional neural networks
1.2.6 Recursive neural networks
1.2.7 Long short-term memory
1.2.8 Deep belief networks
1.2.9 Convolutional deep belief networks
1.2.10 Large memory storage and retrieval neural networks
1.2.11 Deep Boltzmann machines
1.2.12 Encoder–decoder networks
1.2.13 Deep learning applications
1.3 DEEP LEARNING WITH MATLAB: NEURAL NETWORK TOOLBOX (DEEP LEARNING TOOLBOX)
1.4 Using the Deep Learning Toolbox
1.5 Automatic Script Generation
1.6 Deep Learning Toolbox Applications
1.7 Neural Network Design Steps
DEEP LEARNING WITH MATLAB: CONVOLUTIONAL NEURAL NETWORKS. FUNCTIONS
2.2.1 Create an image input layer: imageInputLayer
2.2.2 Create a 2-D convolutional layer: convolution2dLayer
2.2.3 Create a Rectified Linear Unit (ReLU) layer: reluLayer
2.2.4 Create a local response normalization layer: crossChannelNormalizationLayer
2.2.5 Create an average pooling layer: averagePooling2dLayer
2.2.6 Create a max pooling layer: maxPooling2dLayer
2.2.7 Create a fully connected layer: fullyConnectedLayer
2.2.8 Create a dropout layer: dropoutLayer
2.2.9 Create a softmax layer: softmaxLayer
2.2.10 Create a classification output layer: classificationLayer
2.3.1 Train network: trainNetwork
2.3.2 Options for training a neural network: trainingOptions
2.4 Extract Features and Predict Outcomes. FUNCTIONS
2.4.1 Compute network layer activations: activations
2.4.2 Predict responses using a trained network: predict
2.4.3 Classify data using a trained network: classify
DEEP LEARNING WITH MATLAB: CONVOLUTIONAL NEURAL NETWORKS. CLASSES
3.2.1 Network layer: Layer
3.3.1 Series network class: SeriesNetwork
3.3.2 Training options for stochastic gradient descent with momentum: TrainingOptionsSGDM
3.4.1 ImageInputLayer class
3.4.2 Convolution2DLayer class
3.4.3 ReLULayer class
3.4.4 CrossChannelNormalizationLayer class
3.4.5 AveragePooling2DLayer class
3.4.6 MaxPooling2DLayer class
3.4.7 FullyConnectedLayer class
3.4.8 DropoutLayer class
3.4.9 SoftmaxLayer class
3.4.10 ClassificationOutputLayer class
DEEP LEARNING WITH MATLAB: Image Category Classification
4.1 Overview
4.2 Check System Requirements
4.3 Download Image Data
4.4 Load Images
4.5 Download Pre-trained Convolutional Neural Network (CNN)
4.6 Load Pre-trained CNN
4.7 Pre-process Images For CNN
4.8 Prepare Training and Test Image Sets
4.9 Extract Training Features Using CNN
4.10 Train A Multiclass SVM Classifier Using CNN Features
4.11 Evaluate Classifier
4.12 Try the Newly Trained Classifier on Test Images
4.13 References
DEEP LEARNING WITH MATLAB: Transfer Learning Using Convolutional Neural Networks and Pretrained Convolutional Neural Networks
5.1 Transfer Learning Using Convolutional Neural Networks
5.2 Pretrained Convolutional Neural Network
DEEP LEARNING WITH MATLAB: FUNCTIONS FOR PATTERN RECOGNITION AND CLASSIFICATION. AUTOENCODER
6.1 INTRODUCTION
6.2 View a Neural Network: view
6.3 Pattern Recognition and Learning Vector Quantization
6.3.1 Pattern recognition network: patternnet
6.3.2 Learning vector quantization neural network: lvqnet
6.4 Training Options and Network Performance
6.4.1 Receiver operating characteristic: roc
6.4.2 Plot receiver operating characteristic: plotroc
6.4.3 Plot classification confusion matrix: plotconfusion
6.4.4 Neural network performance: crossentropy
6.4.5 Construct and Train a Function Fitting Network
6.4.6 Create and Train a Feedforward Neural Network
6.4.7 Create and Train a Cascade Network
6.5 Network performance
6.5.1 Description
6.5.2 Examples
6.6 Fit Regression Model and Plot Fitted Values versus Targets
6.6.1 Description
6.6.2 Examples
6.7 Plot Output and Target Values
6.7.1 Description
6.7.2 Examples
6.8 Plot Training State Values
6.9 Plot Network Performance
6.10 Plot Histogram of Error Values
6.10.1 Syntax
6.10.2 Description
6.10.3 Examples
6.11 Generate a MATLAB Function for Simulating a Neural Network
6.11.1 Create Functions from Static Neural Network
6.11.2 Create Functions from Dynamic Neural Network
6.12 A COMPLETE EXAMPLE: House Price Estimation
6.12.1 The Problem: Estimate House Values
6.12.2 Why Neural Networks?
6.12.3 Preparing the Data
6.12.4 Fitting a Function with a Neural Network
6.12.5 Testing the Neural Network
6.13 Autoencoder class
6.14 Autoencoder FUNCTIONS
6.14.1 Functions
6.14.2 trainAutoencoder
6.14.3 decode
6.14.4 encode
6.14.5 predict
6.14.6 stack
6.14.7 generateFunction
6.14.8 generateSimulink
6.14.9 plotWeights
6.14.10 view
6.15 Construct Deep Network Using Autoencoders
DEEP LEARNING WITH MATLAB: MULTILAYER NEURAL NETWORKS
7.1 Create, Configure, and Initialize Multilayer Neural Networks
7.1.1 Other Related Architectures
7.2 FUNCTIONS TO Create, Configure, and Initialize Multilayer Neural Networks
7.2.1 Initializing Weights (init)
7.2.2 feedforwardnet
7.2.3 configure
7.2.4 init
7.2.5 train
7.2.6 trainlm
7.2.7 tansig
7.2.8 purelin
7.2.9 cascadeforwardnet
7.2.10 patternnet
7.3 Train and Apply Multilayer Neural Networks
7.3.1 Training Algorithms
7.3.2 Training Example
7.3.3 Use the Network
7.4 Training Algorithms in Multilayer Neural Networks
7.4.1 trainbr: Bayesian regularization backpropagation
7.4.2 trainscg: Scaled conjugate gradient backpropagation
7.4.3 trainrp: Resilient backpropagation
7.4.4 trainbfg: BFGS quasi-Newton backpropagation
7.4.5 traincgb: Conjugate gradient backpropagation with Powell-Beale restarts
7.4.6 traincgf: Conjugate gradient backpropagation with Fletcher-Reeves updates
7.4.7 traincgp: Conjugate gradient backpropagation with Polak-Ribiére updates
7.4.8 trainoss: One-step secant backpropagation
7.4.9 traingdx: Gradient descent with momentum and adaptive learning rate backpropagation
7.4.10 traingdm: Gradient descent with momentum backpropagation
7.4.11 traingd: Gradient descent backpropagation
DEEP LEARNING WITH MATLAB: ANALYZE AND DEPLOY TRAINED NEURAL NETWORK
8.1 ANALYZE NEURAL NETWORK PERFORMANCE
8.2 Improving Results
8.3 Deployment Functions and Tools for Trained Networks
8.4 Generate Neural Network Functions for Application Deployment
8.5 Deploy Neural Network Simulink Diagrams
8.5.1 Example
8.5.2 Suggested Exercises
8.6 Deploy Training of Neural Networks
TRAINING SCALABILITY AND EFFICIENCY
9.1 Neural Networks with Parallel and GPU Computing
9.1.1 Modes of Parallelism
9.1.2 Distributed Computing
9.1.3 Single GPU Computing
9.1.4 Distributed GPU Computing
9.1.5 Deep Learning
9.1.6 Parallel Time Series
9.1.7 Parallel Availability, Fallbacks, and Feedback
9.2 Automatically Save Checkpoints During Neural Network Training
9.3 Optimize Neural Network Training Speed and Memory
9.3.1 Memory Reduction
9.3.2 Fast Elliot Sigmoid
DEEP LEARNING WITH MATLAB: OPTIMAL SOLUTIONS
10.1 Representing Unknown or Don’t-Care Targets
10.1.1 Choose Neural Network Input-Output Processing Functions
10.1.2 Representing Unknown or Don’t-Care Targets
10.2 Configure Neural Network Inputs and Outputs
10.3 Divide Data for Optimal Neural Network Training
10.4 Choose a Multilayer Neural Network Training Function
10.4.1 SIN Data Set
10.4.2 PARITY Data Set
10.4.3 ENGINE Data Set
10.4.4 CANCER Data Set
10.4.5 CHOLESTEROL Data Set
10.4.6 DIABETES Data Set
10.4.7 Summary
10.5 Improve Neural Network Generalization and Avoid Overfitting
10.5.1 Retraining Neural Networks
10.5.2 Multiple Neural Networks
10.5.3 Early Stopping
10.5.4 Index Data Division (divideind)
10.5.5 Random Data Division (dividerand)
10.5.6 Block Data Division (divideblock)
10.5.7 Interleaved Data Division (divideint)
10.5.8 Regularization
10.5.9 Modified Performance Function
10.5.10 Automated Regularization (trainbr)
10.5.11 Summary and Discussion of Early Stopping and Regularization
10.5.12 Posttraining Analysis (regression)
10.6 Train Neural Networks with Error Weights
10.7 Normalize Errors of Multiple Outputs
DEEP LEARNING WITH MATLAB: CLASSIFICATION WITH NEURAL NETWORKS. EXAMPLES
11.1 Crab Classification
11.1.1 Why Neural Networks?
11.1.2 Preparing the Data
11.1.3 Building the Neural Network Classifier
11.1.4 Testing the Classifier
11.2 Wine Classification
11.2.1 The Problem: Classify Wines
11.2.2 Why Neural Networks?
11.2.3 Preparing the Data
11.2.4 Pattern Recognition with a Neural Network
11.2.5 Testing the Neural Network
11.3 Cancer Detection
11.3.1 Formatting the Data
11.3.2 Ranking Key Features
11.3.3 Classification Using a Feed Forward Neural Network
11.4 Character Recognition
11.4.1 Creating the First Neural Network
11.4.2 Training the First Neural Network
11.4.3 Training the Second Neural Network
11.4.4 Testing Both Neural Networks
DEEP LEARNING WITH MATLAB: AUTOENCODERS AND CLUSTERING WITH NEURAL NETWORKS. EXAMPLES
12.1 Train Stacked Autoencoders for Image Classification
12.1.1 Data set
12.1.2 Training the first autoencoder
12.1.3 Visualizing the weights of the first autoencoder
12.1.4 Training the second autoencoder
12.1.5 Training the final softmax layer
12.1.6 Forming a stacked neural network
12.1.7 Fine tuning the deep neural network
12.1.8 Summary
12.2 Transfer Learning Using Convolutional Neural Networks
12.3 Iris Clustering
12.3.1 Why Self-Organizing Map Neural Networks?
12.3.2 Preparing the Data
12.3.3 Clustering with a Neural Network
12.4 Gene Expression Analysis
12.4.1 The Problem: Analyzing Gene Expressions in Baker’s Yeast (Saccharomyces cerevisiae)
12.4.2 The Data
12.4.3 Filtering the Genes
12.4.4 Principal Component Analysis
12.4.5 Cluster Analysis: Self-Organizing Maps